Feature selection methods are usually evaluated by wrapping specific classifiers and datasets in the evaluation process, which very often results in unfair comparisons between methods. In this work, we develop a theoretical framework for obtaining the true feature ordering of two-dimensional sequential forward feature selection methods based on mutual information. The framework is independent of entropy or mutual information estimation methods, classifiers, and datasets, and therefore supports an unambiguous comparison of the methods. Moreover, it unveils problems intrinsic to some methods that are otherwise difficult to detect, namely inconsistencies in the construction of the objective function used to select candidate features, caused by various types of indetermination and by the fact that the entropy of continuous random variables can take null or negative values.
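As a point of reference for the class of methods discussed, the following is a minimal sketch of sequential forward feature selection with a mutual-information objective on discrete data. The relevance-minus-redundancy criterion shown here is only one illustrative choice (in the style of mRMR-like difference criteria), not the specific objective analyzed in this work, and the plug-in MI estimator assumes discrete-valued features.

```python
from collections import Counter
import math

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) in nats from paired discrete samples."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px = Counter(xs)
    py = Counter(ys)
    mi = 0.0
    for (x, y), c in pxy.items():
        # p(x,y) * log( p(x,y) / (p(x) p(y)) ), with counts: c*n / (px*py)
        mi += (c / n) * math.log(c * n / (px[x] * py[y]))
    return mi

def forward_select(features, target, k):
    """Greedy sequential forward selection of k feature indices.

    At each step, add the candidate maximizing a relevance-minus-redundancy
    score (an illustrative mRMR-style objective; published criteria differ
    in exactly how redundancy is weighted, which is where the
    inconsistencies discussed in the paper arise).
    """
    selected = []
    remaining = list(range(len(features)))
    while remaining and len(selected) < k:
        def score(j):
            relevance = mutual_information(features[j], target)
            if not selected:
                return relevance
            redundancy = sum(mutual_information(features[j], features[s])
                             for s in selected) / len(selected)
            return relevance - redundancy
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected
```

Because the same greedy loop can be driven by different objective functions, two methods may produce different orderings on identical data; the framework in this work compares those orderings without involving any estimator, classifier, or dataset.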